Pseudo Orthogonal Bases Give the Optimal Generalization Capability in Neural Network Learning

Authors

  • Masashi Sugiyama
  • Hidemitsu Ogawa
Abstract

Pseudo orthogonal bases are a certain type of frame proposed in the engineering field, whose concept is equivalent to a tight frame with frame bound 1 in frame terminology. This paper shows that pseudo orthogonal bases play an essential role in neural network learning. One of the most important issues in neural network learning is “what training data provide the optimal generalization capability?”, which is referred to as active learning in the neural network community. We derive a necessary and sufficient condition on training data to provide the optimal generalization capability in the trigonometric polynomial space, where the concept of pseudo orthogonal bases is essential. By utilizing useful properties of pseudo orthogonal bases, we clarify the mechanism by which the optimal generalization is achieved.
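As a concrete illustration of the frame condition above (this NumPy sketch is an addition for illustration and is not taken from the paper): a pseudo orthogonal basis, i.e. a tight frame with frame bound 1, reconstructs every vector from its frame coefficients exactly as an orthonormal basis does, even though the frame vectors may be linearly dependent. The classical three-vector frame in R^2, scaled by sqrt(2/3), is one such example.

    import numpy as np

    # Three directions at 120-degree spacing in R^2, each scaled by sqrt(2/3):
    # a classical example of a tight frame with frame bound 1.
    angles = np.pi / 2 + 2 * np.pi * np.arange(3) / 3
    frame = np.sqrt(2.0 / 3.0) * np.stack([np.cos(angles), np.sin(angles)], axis=1)

    # Frame operator S = sum_k phi_k phi_k^T; frame bound 1 means S is the identity.
    S = frame.T @ frame
    assert np.allclose(S, np.eye(2))

    # Consequence: analysis followed by synthesis reconstructs any x exactly,
    # as with an orthonormal basis, although the three vectors are dependent.
    x = np.array([0.7, -1.3])
    assert np.allclose(frame.T @ (frame @ x), x)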


Related articles

Pseudo Orthogonal Bases Give the Optimal Solution to Active Learning in Neural Networks

Pseudo orthogonal bases are a certain type of frame proposed in the engineering field, whose concept is equivalent to a normalized tight frame in frame terminology. This paper shows that pseudo orthogonal bases play an essential role in neural network learning. One of the most important issues in neural network learning is “what training data provides the optimal generalization capability?...

Active Learning for Optimal Generalization in Trigonometric Polynomial Models

In this paper, we consider the problem of active learning and give a necessary and sufficient condition on sample points for the optimal generalization capability. By utilizing the properties of pseudo orthogonal bases, we clarify the mechanism of achieving the optimal generalization capability. We also show that the condition not only provides the optimal generalization capability but als...
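The paper states its exact condition in terms of pseudo orthogonal bases; as a hedged illustration only (the choice of equidistant sample points here is an assumption for demonstration, not the paper's full condition), the sketch below checks the discrete orthogonality that equidistant sampling induces on the trigonometric basis functions.

    import numpy as np

    # Trigonometric polynomial model of order N: span{1, cos kx, sin kx, k = 1..N},
    # dimension 2N + 1.  With M >= 2N + 1 equidistant sample points the sampled
    # basis functions are mutually orthogonal (discrete orthogonality), the kind of
    # structure that pseudo-orthogonal-basis arguments exploit.
    N, M = 3, 8
    x = 2 * np.pi * np.arange(M) / M                # equidistant design points

    columns = [np.ones(M)]
    for k in range(1, N + 1):
        columns += [np.cos(k * x), np.sin(k * x)]
    Phi = np.stack(columns, axis=1)                 # design matrix, shape (M, 2N + 1)

    gram = Phi.T @ Phi                              # Gram matrix of the sampled basis
    assert np.allclose(gram, np.diag(np.diag(gram)))
    print(np.diag(gram))                            # M for the constant, M/2 otherwise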

Two Novel Learning Algorithms for CMAC Neural Network Based on Changeable Learning Rate

The Cerebellar Model Articulation Controller (CMAC) neural network is a computational model of the cerebellum that acts as a lookup table. The advantages of CMAC are fast learning convergence and the capability of mapping nonlinear functions, owing to its local generalization of weight updating, simple structure, and easy processing. In the training phase, a disadvantage of some CMAC models is an unstable phenomenon...
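A minimal sketch of the lookup-table mechanism described above (the table size, receptive-field width, and decaying learning-rate schedule are assumptions of this sketch; the paper's two specific algorithms are not reproduced here):

    import numpy as np

    # Minimal 1-D CMAC-style lookup table.  Each input activates n_layers overlapping
    # cells; only those weights are updated, which is the "local generalization of
    # weight updating" mentioned above.  The decaying learning rate is a simple
    # stand-in for a changeable learning rate.
    n_cells, n_layers = 64, 4
    weights = np.zeros(n_cells + n_layers)

    def active_cells(u):
        # indices of the cells activated by a scalar input u in [0, 1)
        return int(u * n_cells) + np.arange(n_layers)

    rng = np.random.default_rng(0)
    xs = rng.uniform(0.0, 1.0, size=200)
    ys = np.sin(2 * np.pi * xs)                     # target function to be memorized

    for epoch in range(30):
        lr = 0.5 / (1 + epoch)                      # learning rate changes each epoch
        for u, y in zip(xs, ys):
            idx = active_cells(u)
            error = y - weights[idx].sum()          # prediction: sum of active weights
            weights[idx] += lr * error / n_layers   # update only the activated cells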

A Computational Study of Incremental Projection Learning in Neural Networks

One of the essential aspects of supervised learning in neural networks is generalization capability: the ability to give accurate results for data that were not seen during the learning process. One supervised learning method that theoretically guarantees the optimal generalization capability is projection learning. The method was formulated as an inverse problem from a functional analytic point of view i...
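Projection learning is defined in the original work through operators on a function space; as a loose finite-dimensional illustration only (the trigonometric model, sample size, and noise level below are assumptions of this sketch), the estimate is the orthogonal projection of the noisy data onto the model subspace, computed with the Moore-Penrose pseudo-inverse of the design matrix.

    import numpy as np

    # Least-squares projection of noisy samples onto a trigonometric model subspace.
    rng = np.random.default_rng(1)
    x = rng.uniform(-np.pi, np.pi, size=30)                 # sample points
    y = np.sin(x) + 0.5 * np.cos(2 * x) + 0.05 * rng.normal(size=x.size)

    Phi = np.stack([np.ones_like(x), np.cos(x), np.sin(x),
                    np.cos(2 * x), np.sin(2 * x)], axis=1)  # trigonometric model, order 2
    coeffs = np.linalg.pinv(Phi) @ y                        # pseudo-inverse estimate
    print(np.round(coeffs, 2))                              # roughly [0, 0, 1, 0.5, 0]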

A multi-scale convolutional neural network for automatic cloud and cloud shadow detection from Gaofen-1 images

Reconstructing the information contaminated by cloud and cloud shadow is an important step in the pre-processing of high-resolution satellite images. Automatic segmentation of cloud and cloud shadow can be the first step in this reconstruction process. This stage is a considerable challenge due to the relatively inefficient performanc...


Journal title:

Volume   Issue

Pages  -

Publication year: 2007